Searching for the core variables in principal components analysis

Authors
Abstract


Similar articles

Persian Handwriting Analysis Using Functional Principal Components

Principal components analysis is a well-known statistical method for dealing with large, dependent data sets. It is also used with functional data, both for data reduction and for representing variation. On the other hand, handwriting is one of the objects studied in statistical fields such as pattern recognition and shape analysis. Considering time as the argument,...


Choosing the Best Hierarchical Clustering Technique Based on Principal Components Analysis for Suspended Sediment Load Estimation

1- INTRODUCTION The assessment of watershed sediment load is necessary for controlling soil erosion and reducing the potential for sediment production. Differing estimates of sediment amounts, along with the lack of long-term measurements, limit access to reliable data series on erosion rates and sediment yield. Therefore, the observed data of suspended sediment load could be used to ...


A Time-Series Analysis of the Demand for Life Insurance in Iran

Based on our data analysis, we found that income level and the number of agencies are directly related to the demand for life insurance, while the interest rate and the dependency burden are inversely related to it.

Online Principal Components Analysis

We consider the online version of the well-known Principal Component Analysis (PCA) problem. In standard PCA, the input to the problem is a set of d-dimensional vectors X = [x_1, ..., x_n] and a target dimension k < d; the output is a set of k-dimensional vectors Y = [y_1, ..., y_n] that minimize the reconstruction error: min_Φ Σ_i ‖x_i − Φ y_i‖². Here, Φ ∈ R^{d×k} is restricted to being isometric. The...
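As a reference point for the objective above, here is a minimal offline (batch) sketch in Python with NumPy; the function name, the choice to centre the data, and the SVD-based solution are illustrative assumptions, not the online algorithm the paper studies.

```python
import numpy as np

def pca_reconstruction_error(X, k):
    """Offline baseline for the objective above: find an isometric Phi (d x k)
    minimising sum_i ||x_i - Phi y_i||^2 with y_i = Phi^T x_i.
    The optimum is spanned by the top-k left singular vectors of the data matrix.
    (Sketch under stated assumptions; centring is optional in some formulations.)"""
    X = X - X.mean(axis=1, keepdims=True)       # centre the columns x_1..x_n
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    Phi = U[:, :k]                              # isometric: Phi^T Phi = I_k
    Y = Phi.T @ X                               # k-dimensional representations y_i
    err = np.linalg.norm(X - Phi @ Y, "fro") ** 2
    return Phi, Y, err

# usage: d = 10 features, n = 200 vectors, target dimension k = 3
X = np.random.randn(10, 200)
Phi, Y, err = pca_reconstruction_error(X, k=3)
```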


Principal Components Analysis

Derivation of PCA I: For a set of d-dimensional data vectors {x_i}, the principal axes {e_j}, j = 1, ..., q, are those orthonormal axes onto which the retained variance under projection is maximal. It can be shown that the vectors e_j are given by the q dominant eigenvectors of the sample covariance matrix S, i.e. those satisfying S e_j = λ_j e_j. The q principal components of the observed vector x_i are given by the vector ...
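A minimal sketch of that eigen-derivation, assuming a NumPy environment and a data matrix whose rows are the observations x_i; the function and variable names are illustrative.

```python
import numpy as np

def principal_axes(X, q):
    """Principal axes as the q dominant eigenvectors of the sample
    covariance matrix S, satisfying S e_j = lambda_j e_j."""
    Xc = X - X.mean(axis=0)                # centre the observations (rows of X)
    S = np.cov(Xc, rowvar=False)           # d x d sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(S)   # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1][:q]  # indices of the q largest eigenvalues
    E = eigvecs[:, order]                  # columns are the principal axes e_1..e_q
    Z = Xc @ E                             # principal components of each x_i
    return E, eigvals[order], Z
```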



Journal

Journal title: Brazilian Journal of Probability and Statistics

Year: 2018

ISSN: 0103-0752

DOI: 10.1214/17-bjps361